
frailtypack (version 2.3)

frailtyPenal for Nested frailty models: Fit a Nested Frailty model using a semiparametric penalized likelihood estimation or a parametric estimation

Description

Fit a nested frailty model using a penalized likelihood on the hazard function or using a parametric estimation. Nested frailty models allow survival studies for hierarchically clustered data by including two iid gamma random effects. Left-truncated and censored data are allowed. Stratification analysis is allowed (maximum of two strata). The hazard function conditional on the two frailties $v_i$ and $w_{ij}$ for the $k^{th}$ individual of the $j^{th}$ subgroup of the $i^{th}$ group is: $$\left\{ \begin{array}{l} \lambda_{ijk}(t|v_i,w_{ij})=v_i w_{ij}\lambda_0(t)\exp(\bold{\beta^{'}X_{ijk}}) \\ v_i\sim\Gamma\left(\frac{1}{\alpha},\frac{1}{\alpha}\right) \hspace{0.05cm} i.i.d. \hspace{0.2cm} \bold{E}(v_i)=1 \hspace{0.2cm} \bold{Var}(v_i)=\alpha \\ w_{ij}\sim\Gamma\left(\frac{1}{\eta},\frac{1}{\eta}\right) \hspace{0.05cm} i.i.d. \hspace{0.2cm} \bold{E}(w_{ij})=1 \hspace{0.2cm} \bold{Var}(w_{ij})=\eta \end{array} \right.$$ where $\lambda_0(t)$ is the baseline hazard function, $X_{ijk}$ denotes the covariate vector and $\beta$ the corresponding vector of regression parameters.
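
For intuition, the following is a minimal simulation sketch (not part of frailtypack) that generates data from the conditional hazard above, assuming a constant baseline hazard lambda0 and one binary covariate; all object names and parameter values are illustrative.

## Illustrative simulation from the nested frailty model (base R only)
set.seed(1)
n.group <- 20; n.sub <- 4; n.ind <- 5      # i groups, j subgroups, k individuals
alpha <- 0.5; eta <- 0.3                   # Var(v_i) and Var(w_ij)
beta <- 0.7; lambda0 <- 0.1                # regression coefficient, baseline hazard

sim <- do.call(rbind, lapply(seq_len(n.group), function(i) {
  v_i <- rgamma(1, shape = 1/alpha, rate = 1/alpha)     # group frailty, E(v_i) = 1
  do.call(rbind, lapply(seq_len(n.sub), function(j) {
    w_ij <- rgamma(1, shape = 1/eta, rate = 1/eta)      # subgroup frailty, E(w_ij) = 1
    x <- rbinom(n.ind, 1, 0.5)
    haz <- v_i * w_ij * lambda0 * exp(beta * x)         # lambda_ijk(t | v_i, w_ij)
    t.event <- rexp(n.ind, rate = haz)                  # event times (exponential baseline)
    t.cens  <- rexp(n.ind, rate = 0.05)                 # independent censoring times
    data.frame(group = i, subgroup = (i - 1) * n.sub + j,
               t1 = 0, t2 = pmin(t.event, t.cens),
               event = as.integer(t.event <= t.cens), cov1 = x)
  }))
}))
head(sim)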

Arguments

formula
a formula object, with the response on the left of a ~ operator and the terms on the right. The response must be a survival object as returned by the 'Surv' function of the survival package. The subcluster() function is required to identify the subcluster variable.
formula.terminalEvent
Not required.
data
a 'data.frame' in which to interpret the variables named in the 'formula'.
Frailty
Logical value. Is a model with frailties fitted? If so, the variance of the frailty parameter is estimated. If not, a Cox proportional hazards model is estimated using a penalized likelihood on the hazard function. The default is FALSE.
joint
Not required.
recurrentAG
Logical value. Is the Andersen-Gill model fitted? If so, recurrent event times are handled with the counting process approach of Andersen and Gill. This formulation can be used for dealing with time-dependent covariates. The default is FALSE.
cross.validation
Logical value. Is the cross validation procedure used for estimating the smoothing parameter in the penalized likelihood estimation? If so, a search of the smoothing parameter using cross validation is done, with kappa1 as the seed value. The default is FALSE.
n.knots
integer giving the number of knots to use. Value required in the penalized likelihood estimation. It corresponds to (n.knots+2) spline functions for the approximation of the hazard or the survival functions. The number of knots must be between 4 and 20.
kappa1
positive smoothing parameter. The coefficient kappa of the integral of the squared second derivative of the hazard function in the fit (penalized log-likelihood). To obtain an initial value for kappa1 (or kappa2), a solution is to fit the corresponding shared frailty model using the cross validation procedure (see cross.validation).
kappa2
positive smoothing parameter in the penalized likelihood estimation for the second stratum, when data are stratified. See kappa1.
maxit
maximum number of iterations for the Marquardt algorithm. Default is 350.
hazard
Type of hazard functions: "Splines" for semiparametric hazard function with the penalized likelihood estimation, "Piecewise-per" for piecewise constant hazard function using percentiles, "Piecewise-equi" for piecewise constant hazard function using equidistant intervals, or "Weibull" for the parametric Weibull function. Default is "Splines". An illustrative call using these options follows the argument list.
nb.int1
Number of intervals (between 1 and 20) for the parametric hazard functions ("Piecewise-per", "Piecewise-equi")
RandDist
Not implemented for nested frailty models.
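
As an illustration of how these arguments combine, here is a hedged sketch of a parametric piecewise-constant fit on the dataNested example data; the assumption that n.knots and kappa1 can be omitted for parametric hazards, and the value of nb.int1, are illustrative and should be checked against the installed package version.

library(frailtypack)
data(dataNested)

# Parametric piecewise-constant baseline hazard on percentile-based intervals;
# the smoothing arguments (n.knots, kappa1) are assumed unnecessary here.
mod.pw <- frailtyPenal(Surv(t1,t2,event) ~ cluster(group) +
            subcluster(subgroup) + cov1 + cov2,
            data = dataNested, Frailty = TRUE,
            hazard = "Piecewise-per", nb.int1 = 10)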

Value

  • a nested frailty model or, more generally, an object of class 'frailtyPenal'. Methods defined for 'frailtyPenal' objects are provided for print, plot and summary. The following components are included in a 'frailtyPenal' object for nested frailty models; a short access sketch follows this list.
  • b: sequence of the corresponding estimates of the spline coefficients, the random effects variances and the regression coefficients.
  • alpha: variance of the cluster effect $(\bold{Var}(v_{i}))$.
  • call: the code used for fitting the model.
  • coef: the regression coefficients.
  • cross.Val: logical value. Is the cross validation procedure used for estimating the smoothing parameters?
  • DoF: degrees of freedom.
  • eta: variance of the subcluster effect $(\bold{Var}(w_{ij}))$.
  • formula: the formula part of the code used for the model.
  • groups: the maximum number of groups used in the fit.
  • subgroups: the maximum number of subgroups used in the fit.
  • kappa: a vector with the smoothing parameters corresponding to each baseline function as components.
  • lam: matrix of hazard estimates and confidence bands.
  • lam2: the same value as lam for the second stratum.
  • loglikPenal: the complete marginal penalized log-likelihood in the semiparametric case.
  • loglik: the marginal log-likelihood in the parametric case.
  • n: the number of observations used in the fit.
  • n.events: the number of events observed in the fit.
  • n.iter: number of iterations needed to converge.
  • n.knots: number of knots for estimating the baseline functions.
  • n.strat: a vector with the number of covariates of each type of hazard function as components.
  • surv: matrix of baseline survival estimates and confidence bands.
  • surv2: the same value as surv for the second stratum.
  • varH: the variance matrix of all parameters before the positivity constraint transformation (alpha, eta, the regression coefficients and the spline coefficients). Thereafter, the delta method is needed to obtain the estimated variance parameters.
  • varHIH: the robust estimation of the variance matrix of all parameters (alpha, eta, the regression coefficients and the spline coefficients).
  • x1: vector of times at which both the survival and hazard functions are estimated. By default seq(0,max(time),length=99), where time is the vector of survival times.
  • x2: vector of times for the second stratum (see the x1 value).
  • type.of.hazard: type of hazard functions (0:"Splines", 1:"Piecewise", 2:"Weibull").
  • type.of.Piecewise: type of piecewise hazard functions (1:"percentile", 0:"equidistant").
  • nbintervR: number of intervals (between 1 and 20) for the parametric hazard functions ("Piecewise-per", "Piecewise-equi").
  • npar: number of parameters.
  • nvar: number of explanatory variables.
  • noVar: indicator of explanatory variables.
  • LCV: the approximated likelihood cross-validation criterion in the semiparametric case (with H minus the converged Hessian matrix, and l(.) the full log-likelihood). $$LCV=\frac{1}{n}\left(trace(H^{-1}_{pl}H) - l(.)\right)$$
  • AIC: the Akaike information criterion for the parametric case. $$AIC=\frac{1}{n}\left(np - l(.)\right)$$
  • n.knots.temp: initial value for the number of knots.
  • shape.weib: shape parameters for the Weibull hazard function.
  • scale.weib: scale parameters for the Weibull hazard function.
  • martingale.res: martingale residuals for each cluster.
  • frailty.pred.group: empirical Bayes prediction of the frailty term by group.
  • frailty.pred.subgroup: empirical Bayes prediction of the frailty term by subgroup.
  • linear.pred: linear predictor: uses "Beta'X + log(v_i.w_ij)" in the nested frailty models.
  • subgbyg: subgroup by group.
  • global_chisq: a vector with the values of each multivariate Wald test.
  • dof_chisq: a vector with the degrees of freedom for each multivariate Wald test.
  • global_chisq.test: a binary variable equal to 0 when no multivariate Wald test is given, 1 otherwise.
  • p.global_chisq: a vector with the p-values for each global multivariate Wald test.
  • names.factor: names of the "as.factor" variables.
  • Xlevels: vector of the values that the factor might have taken.
  • contrasts: type of contrast for factor variables.
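
A minimal inspection sketch, assuming a fitted object such as modClu from the Examples section (component names as listed above; the last line assumes lam has one row per time point in x1):

modClu$alpha      # variance of the cluster effect Var(v_i)
modClu$eta        # variance of the subcluster effect Var(w_ij)
modClu$coef       # estimated regression coefficients
modClu$n.iter     # number of iterations needed to converge
modClu$LCV        # approximated likelihood cross-validation criterion
head(cbind(time = modClu$x1, modClu$lam))   # baseline hazard estimates and confidence bands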

Synopsis

frailtyPenal(formula, formula.terminalEvent, data, Frailty = FALSE, joint = FALSE, recurrentAG = FALSE, cross.validation = FALSE, n.knots, kappa1, kappa2, maxit = 350, hazard = "Splines", nb.int1, nb.int2, RandDist = "Gamma")

Details

The estimated parameters are obtained by maximizing the penalized log-likelihood using the robust Marquardt algorithm (Marquardt, 1963), which is a combination of a Newton-Raphson algorithm and a steepest descent algorithm. When the frailty parameter is small, numerical problems may arise. To solve this problem, an alternative formula of the penalized log-likelihood is used (see Rondeau, 2003 for further details). Cubic M-splines of order 4 are used for the hazard function, and I-splines (integrated M-splines) are used for the cumulative hazard function. The smoothing parameter can be fixed a priori or chosen by maximizing a likelihood cross validation criterion. The iterations are stopped when the difference between two consecutive log-likelihoods is small ($<10^{-4}$), the estimated coefficients are stable (consecutive values $<10^{-4}$), and the gradient is small enough ($<10^{-3}$). To be sure of having a positive function at all stages of the algorithm, the spline coefficients are reparametrized at each stage. The variance space of the two random effects is reduced, so that the variances are positive and the correlation coefficient is constrained between -1 and 1. The integrations in the full log-likelihood are evaluated using Gaussian quadrature; Laguerre polynomials are used to treat the integrations on $[0,\infty[$.

INITIAL VALUES

The splines and the regression coefficients are initialized to 0.1. The program fits an adjusted Cox model to provide new initial values for the regression and the spline coefficients. The variances of the frailties are initialized to 0.1. Then, a shared frailty model with covariates and only the subgroup frailty is fitted to give a new initial value for the variance of the subgroup frailty term. Then, a shared frailty model with covariates and only the group frailty term is fitted to give a new initial value for the variance of the group frailties. In a last step, a nested frailty model is fitted.
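
The convergence thresholds above are internal; from the user side, maxit bounds the number of Marquardt iterations and kappa1 controls the smoothing. Below is a hedged sketch of a sensitivity check on kappa1 using the returned LCV component; the kappa1 values are arbitrary and each fit may take a long time on dataNested.

library(frailtypack)
data(dataNested)
kappas <- c(1e4, 5e4, 1e5)               # illustrative smoothing parameters
fits <- lapply(kappas, function(k)
  frailtyPenal(Surv(t1,t2,event) ~ cluster(group) + subcluster(subgroup) +
                 cov1 + cov2, data = dataNested, Frailty = TRUE,
               n.knots = 8, kappa1 = k, maxit = 350))
data.frame(kappa1 = kappas,
           LCV    = sapply(fits, function(f) f$LCV),     # likelihood cross-validation criterion
           n.iter = sapply(fits, function(f) f$n.iter))  # iterations to convergence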

References

V. Rondeau, Y. Mazroui and J. R. Gonzalez (2012). Frailtypack: An R package for the analysis of correlated survival data with frailty models using penalized likelihood estimation or parametric estimation. Journal of Statistical Software, 47, 1-28.

V. Rondeau, L. Filleul and P. Joly (2006). Nested frailty models using maximum penalized likelihood estimation. Statistics in Medicine, 25, 4036-4052.

V. Rondeau, D. Commenges and P. Joly (2003). Maximum penalized likelihood estimation in a gamma-frailty model. Lifetime Data Analysis, 9, 139-153.

D. Marquardt (1963). An algorithm for least-squares estimation of nonlinear parameters. SIAM Journal of Applied Mathematics, 431-441.

See Also

print.nestedPenal, summary.nestedPenal, plot.nestedPenal, cluster, subcluster, strata

Examples

### Nested model (or hierarchical model) with 2 covariates ###

library(frailtypack)
data(dataNested)
modClu <- frailtyPenal(Surv(t1,t2,event)~cluster(group)+
             subcluster(subgroup)+cov1+cov2,Frailty=TRUE,
             data=dataNested,n.knots=8,kappa1=50000)

# It takes around 24 minutes to converge (depends on the processor)#

modClu.str <- frailtyPenal(Surv(t1,t2,event)~cluster(group)+
                subcluster(subgroup)+cov1+strata(cov2),Frailty=TRUE,
                data=dataNested,n.knots=8,kappa1=20000,kappa2=20000)

# It takes around 8 minutes to converge (depends on the processor)#
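
# The print, summary and plot methods listed under 'Value' and 'See Also' can be
# applied to the fitted objects; a minimal usage sketch:
print(modClu)      # short description of the fit
summary(modClu)    # summary of the estimated effects
plot(modClu)       # estimated baseline hazard with confidence bands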
